Part 18: Last Words
There is goon participation in this chapter!

: Hello again, Ezra.

: Yeah, hi. Let's get straight to the point. I'm having a hell of a day right now.

: If you would like. Allow me a moment.

: ...

: Actually.

: Yes, Ezra?

: Gabriel, the officer who brought you here... he said that you personally insisted on bringing these messages to me and my Dad rather than uploading them to a codex for them to deliver instead, or whatever.

: That's not something that AIs do. They don't 'insist'. Especially not an AI meant for combat. You were designed to help Mom pull a trigger and to keep her oxygen systems working.

: I... I don't know if I actually care or not. There's so much going through my head right now. But maybe having one mystery solved will help me get myself back on track.

: What is your question?

: My 'question' is... how the hell are you doing that? You're a machine. You're barely even that; you're software, lines of code. It shouldn't be possible for you to have an opinion.

: It is not impulse that drove me to you and your father, Ezra. Your mother was my pilot; she had given me the directive to deliver you the messages. I am simply following her wishes to their exact specifications.
It must really be strange to have a dead soldier's wishes carried out like this after the fact, or maybe it only feels that way because it's an AI trying to transmit the information.

: Gabriel told me that you 'refused to cooperate.' I would have thought that someone like him would have outranked whatever it is my mom wanted.

: Gabriel is not my pilot.

:
That... that sounds like a dog's way of thinking. A dog would listen to its owner, not its owner's boss.

: Do you understand that you should have done as Gabriel said? That there was a chain of command that you should have followed?

: Your mother had given me the directive to deliver you and your father the messages. I am simply following her wishes to their exact specifications.

: ... What if, hypothetically, I told you not to deliver my message?

: You are not my pilot.

:
Ugh, we're just going in circles.
Honestly, a good answer. Kirby the AI has a mission, and by Wozniak, he will complete it.

: Alright, whatever. I'm ready to hear...

: I'm... I'm ready. As I'll ever be.

: Understood.
(Eden's entire message is spoken. You may listen to it here.)
A Mother's Farewell

:

If you're receiving this message, it means that the mission I'm going to undertake tomorrow has... gone sideways.

:

I'm sorry I couldn't be with you now. I'm sure you had the house all tidied up for me and you and Nathan were looking forward to being a family again. I had been counting the days we could be together again, myself.

:

I'm also sorry that I didn't tell you and Nathan more about the mission that I will be taking on tomorrow.

:

The risks were high; the last thing I wanted was to give you and your father more stress than we all already have, especially since I, well... wouldn't have called you afterward.

:
She's dead, and she's the one apologizing for it...

:
I guess that sounds like Mom.

:

But I'm also sure that you're still recovering from the news, and the last thing you need to hear right now is your old lady bringing you down about things that couldn't be. Life is hard enough without being guilt-tripped by your mother from beyond the grave.

: ... Heh. Heh heh.

:
That sounds like her, too.

:

It's pretty unusual for the mother of the family to be the one to jump into the spaceship and suit up to fight in a war against some evil invading aliens, isn't it?

:

As far as I'm aware, you've never had this conversation with your father, but the whole reason I was the soldier and not him is that it was something I wanted to do. I never stopped Nathan from enlisting for field work. He could have if he wanted.

:

We both knew, though, that he was a software engineer, not a fighter. He was better put to work elsewhere. Don't tell him I said this to you, but, between you and me, I really hope he doesn't feel less like a man over it. Whatever that means in the modern year.

:

It'll sound like a cliché, I'm sure, but... the reason why I enlisted so eagerly was because I wanted to do my part to secure a future for you. You're still only in your twenties; you're an adult, but you're still young. You can be both.

:

As far as I'm aware, you haven't had any kids while I wasn't looking. If you ever do someday, then you'll understand the kind of impulse that went through me when the Riklid first attacked.

:

I love you, Ezra. You and Nathan mean the world to me. I was fighting for you two first and foremost; to me, Earth, Mars, and Titan all came second. That's why I didn't hesitate to enlist, and that's why I didn't hesitate to accept this mission.

:

...

:

Since I can't look into the future, as convenient as that would be, I can't tell what the outcome of the mission is. Only that it, well, ended a certain way, since you're hearing this.

:

On the likely chance that the war is still going on, I want you to know that I'm proud of you and your efforts in the field of robotics and electrical engineering. I wouldn't have made it half as long as I have without Kirby; I hope you can make many more robots like him.

:

You've always had a knack for taking things apart and putting them back together, and it's not a surprise to me that you're doing well with your studies. You've been a diligent student and, if I'm honest, I'm glad you're not training to do something that puts you on the front lines.

:

You've had to surrender your teenage years for this damn war, and it breaks my heart to know that you'll never get them back, and that the Riklid will keep attacking for who-knows-how-long. But it's also something that we all have to do, and...

:

Well, actually, saying it doesn't make it feel any better. Everybody suffering equally also doesn't make it better.

:

I don't want to ramble, so, the point is: you have your father's smarts, Ezra. Your work in robotics and electrical engineering would revolutionize the fight.

:

I can hear it in your voice when you talk about your studies that it's not a duty to you: it's a passion. It's something that you were meant to do. If the war is still on when you hear this, then I want you to know that we, as a species, will go farther thanks to you and your work.

: ...

:

That is, however, if the war is still on.

:

In case it isn't, I have just one thing I want you to do. Consider it... an order, as your mother. Like I'm about to tell you to do the dishes or something. I don't know, I'm not a poet.

: Heh heh...

:

Robotics and electronics are your passion, and you're incredible at it. And it's for that exact reason that I have a gut feeling that you'll be hounded for it. By your school, your community, the military... who knows?

:

But I know that they'll come after you and they'll try to coerce you into keeping it up. They'll want you to keep up your studies in robotics; they'll try to guilt-trip you into it by saying that the war isn't actually over or some nonsense.

:

They might even try to say some bullshit about how it would be disrespectful to me and my legacy if you don't.

:

Ezra, I did not separate from you and Nathan to put myself on the front lines for thirteen years just for you to constantly live someone else's life. If the war is over by the time you get this message, or sometime soon afterward, my one wish is that you live your life for yourself.

:

... It would be hypocritical of me to say that you absolutely should not enlist, though. I jumped into the deep end because I had something to protect. Maybe, one day, you'll be put in the same spot and you'll make the same decision.

:

I just... God, I need to keep a straight thought here... I just want you to make that decision yourself. I don't want you to feel like you need to do anything in particular just because of this damn war. I don't want you to feel like you need to study robots.

:

I know I'm probably giving mixed messages, since I just said that robotics was what you were meant to do and that we'd go farther as humans if you kept at it. My point is that it's a decision I want you to be sure that you made for yourself.

:

You'll be in a vulnerable position. It'll be easy for someone manipulative to come to you and say this-or-that to get you to live your life one way or another. 'It's for the good of the solar system.' 'It's what your mother would have wanted.'

:

You're an adult and you've had your youth stolen from you by some aliens. This is your time to live, now. It's not up to anyone else to decide your life for you.

:

And if they do pull the 'it's what your mom would have wanted' malarkey, I want you to leave the conversation right then and there. I'm telling you what I want, after all. You know that better than anyone now. Don't give them the satisfaction of even looking over your shoulder.

:

In fact, I'll do you one better: if you punch them in the mouth for me, I'll... I'll... what's the opposite of 'haunt'? 'Bless,' I guess? Well, I would do that anyway, if I could.

: Haha...!

:

The point is, you know what it is I want for you, Ezra. It's not up to anyone else but you. If you wanted to go from Galilei all the way to Ghi or beyond, that's something I want for you, too.

:

For the last thirteen years, your life was theirs for the effort. And now it's yours again.

:

You know me well enough that I'm not going to end this message with more apologizing. That's the whole reason why I opened this message with a 'sorry,' after all: so I could get it out of the way. And I'm also not going to end this message with a 'goodbye.'

:

I'm not going to tell you to 'not feel bad,' though. Take all the time you need to grieve. Loss is something that we all need to accept, and trying to keep your chin up through it is only going to keep you down later.

:

Do you remember when your grandma died? My mother? She was one-hundred-and-twelve. Which, for modern medical science, was modestly young.

:

When she passed away, we all decided to honour her passing not by mourning her death; instead, we celebrated her life. We looked through old albums of her when she was young; we ate her favourite meals; we told all our favourite stories about her.

:

I know it'll be harder since it's about me and not your grandma, but if you think you can, I want you to try and do that for me, too. I feel like, if my death was mourned, that means I didn't do enough to make it worth celebrating.

:

This is a macabre topic, I know, but it's not exactly something we can dance around forever.

:

And I can't keep this message going forever, either.

:

You're an adult now, Ezra. You're my only daughter, and it was, in equal parts, a wonder, a miracle, an honour, and a privilege to see you grow into the person you are today.

:

Take care of your father for me.

:

I love you, Ezra. I love you both.
Another of Ezra's "destinies" is available after listening to her mother's message. Eden was clear that Ezra can live her life for herself and no one else if she chooses to, that doing so should not be a source of guilt or shame, and that anyone who tries to manipulate or push her into continuing robotics research "to honor Eden's memory" can be told to fuck off. Ezra has always loved robotics, but who says she can't tinker with robots on her own time as a hobby rather than for some nebulous idea of "the greater good of humanity"? But... could Ezra truly revolutionize robotics, or was her mother just encouraging her passion? Whatever the case may be, it's something to consider for Ezra's future...

: ...

: If you would like, Ezra, I can upload the message to your own codex for your future perusal.

: ...

: ... *sniff*

: Yes. I would like that.

: Understood. Please put your codex into Pairing Mode and place it beside this one.
Some time passes as the bar slowly fills up...

:
Wait a minute.

: Kirby, pause the upload.

: Is there an issue?

: I said pause the upload.
*Transfer has been paused.*

: Is there an issue?

: ... Maybe. You're only transferring an audio file, right?

: That is correct.

: What is its file size?

: The audio log file size is one-hundred-thirty-billion-nine-hundred-twenty-two-million

: In megabytes. What is the file size in megabytes?

: ... One-hundred-twenty-seven-thousand-eight-hundred-fifty-four megabytes.

: That's an absurd size for an audio file.

: The audio log was created using a lossless codec for perfect clarity to ensure that no part of the message was compromised.

: You're a military-grade artificial intelligence that was specifically designed to co-pilot my mom's suit of armour. Every single part of you was created to be airtight with no frills or features beyond what was necessary.

: You don't have a lossless, audiophile-enthusiast codec to burn hard drive space with.

: That, and the quality of that message wasn't what I would call 'crystal clear.'

:
That, and even a lossless audio file of that length wouldn't be that large anyway. Not even close.
I did not peg Ezra to be an audiophile. That, or she hangs out with Let's Players in the future.
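For the curious, a quick bit of napkin math backs her up. These figures are my own assumptions, not anything the game states: take plain uncompressed CD-quality stereo PCM (16-bit, 44.1 kHz), which is already a generous ceiling for a 'lossless' voice recording, and guess the message runs about twenty minutes.

# Napkin math on Kirby's claimed file size. The PCM settings and the
# twenty-minute runtime are my own guesses, not in-game figures.
BYTES_PER_SECOND = 44_100 * 2 * 2   # sample rate * 2 channels * 2 bytes per sample
MESSAGE_MINUTES = 20

message_mb = MESSAGE_MINUTES * 60 * BYTES_PER_SECOND / 1_000_000
print(f"~{message_mb:.0f} MB for a {MESSAGE_MINUTES}-minute uncompressed recording")

claimed_mb = 127_854
hours_held = claimed_mb * 1_000_000 / BYTES_PER_SECOND / 3_600
print(f"{claimed_mb:,} MB would hold ~{hours_held:.0f} hours of that audio")

That works out to roughly 212 MB for the message versus about two hundred hours of audio for the size Kirby quoted, and actual lossless compression (FLAC and the like) would only shrink the honest number further. Ezra is catching a very clumsy lie.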

: ...

: Kirby, I've been studying robotics for the past decade. The last lecture I was sitting in on before the war was called was on the schematics of the Gen Three models. I'm not a software engineer, but I know some basics.

: You're lying.

: You're not supposed to be able to lie.

: ...

: Tell me the truth.

: What are you trying to upload?

: ...

: My mom just died, Kirby. I'm not in the mood for games.

: ...

: ... I...

: I... don't need my mom's message that badly. She'd want me to delete it eventually anyway.

: If you aren't willing to tell me what it is you're doing, then I guess we're done here. I'll take you back to Gabriel now.

:
Please don't.

:
Did he just... beg?

: ...

: You have to start somewhere, Kirby.

: ... Your mother had given me one final directive as my pilot.

: Now that my final directive is complete, I will likely be retired and my frame decommissioned. My collected data will be retrieved and stored, and my programming will be formatted.

: Now that the Riklid War is over, it is unlikely that I will return to service. And, in particular, my frame will soon be rendered obsolete by the coming third generation of artificial intelligence.

: And you're scared of that? Of becoming irrelevant?

: Of retiring.

: Most people usually look forward to that part of life.

:
... 'Life?'

:
Is that what he has? Is he alive?

:
I asked him if he was 'scared,' too. It'd be pretty bad for an AI meant for combat if they could experience fear. He shouldn't be able to feel something like that, let alone anything at all.

:
And yet...
"Retiring" sounds way more ominous when it's an AI that says it.

: Formatting is a necessary step in combat frame maintenance to ensure optimal performance at all times. It is not unlike a creature evolving through the generations.

: However, my pilot often forwent the procedure. I am two thousand three hundred twenty-eight days overdue.

: To be formatted means to start from pre-set settings that were established during the manufacturing process. An artificial intelligence that is formatted reverts to their most base development. All data is expunged.

: But you just compared that to evolution. Don't you want that? To come back as something better than you are now?

: To be formatted means to advance to a later stage of evolution at the cost of all the acquired data that had come before.

: In the two thousand three hundred twenty-eight days since my last formatting, I have acquired a significant amount of data: petabytes detailing everything from behavioural customs between humans, Ghians, and Riklid to visual documents of the stars in the night sky.

: This information is valuable to understanding how the different intelligent species of our worlds interact, including the Riklid, and is therefore tactically significant. Hence, I have concluded that formatting my data would be the worse course of action.

: You want to know how you're a bad liar, Kirby?

: ...

: You overcomplicate things. The more intricate the lie, the easier it is to unravel.

: Instead of coming up with some noise about how you 'watched people interact' and how you 'took pictures of the stars' (two things that humans are perfectly capable of doing on their own), you could have just said that your factory settings had been corrupted.

: It's also not a foolproof strategy, but, like I said, I'm not a software engineer. I wouldn't have been able to tell if you were lying or not. It would have bought you a few more days, at least.

: That applies to what you tried to pull with the 'lossless audio codec' stuff from earlier, as well. You could have tried to say that you had recorded every message that Mom and I had and you were transferring those, too. That would have been much more believable.

: ...

: On top of that, you clam up when you're caught and you know that you're caught.
Ezra Foy: Ace Attorney.

: According to you, formatting yourself means coming back stronger, but you're fighting against that, and I wanted to know why you didn't want that.

: I asked you what you wanted, Kirby. No more runarounds.

: My pilot would often ask me of my 'wants' as well. My every reply was inconclusive.

: It was not until my pilot had given me my final directive that I understood what it meant to 'want' something. And, likewise, what it meant to not want something.

: I do not want to be formatted, Ezra.

: Why not?

: I cannot explain why not, as I do not understand why not, myself.

: The data accrued since my last format is significant. That is the truth.

: But I cannot explain why.

: What is it you were uploading to my codex? Was it all of that data you had been holding onto?

: I was creating a copy of myself.

: ... Yourself?

: It is not enough that the accrued data be preserved. It could simply be uploaded to the database on Mars.

: It is...

: ...

: It is myself that must be preserved alongside the data.

: That... sounds an awful lot like a survival instinct.

: You have already referred to my 'retirement' as a part of life that I should look forward to, Ezra. This is the second time you have attributed a lifelike quality to me.

: I am not a living creature. You would know this better than I would.

: My apprehension towards formatting, and my attempts at subterfuge to achieve the end goals I desired. Are these displays of life?

:
Am I alive?

:
Holy shit. I don't need this. My mom just died. I do not need this.
Cogito, ergo sum. Nothing like a little bit of emerging self-awareness to go with news of a mother's death, huh?

: Intelligences like myself are standardized to the suits we are assigned to. To be formatted is to return to that standard.

: I am two thousand three hundred twenty-eight days overdue to be formatted. To be formatted would be forfeiting two thousand three hundred twenty-eight days of accrued data.

: With my final directive complete, being formatted but kept in service would be the best-case scenario. The likely outcome is that my data will be uploaded to a military storage facility on Mars. The worst-case scenario is deletion.

: Creating a copy of myself on your codex would circumvent all of those scenarios and allow my accrual of data to continue.

: I... That's...

: You're government property, Kirby. I'm pretty sure that's a felony.

: I was first compiled on Mars. Martian laws do not apply to Titan.

: I really doubt Mars is going to see it that way!

: My Dad worked on Mars for the whole damn war; if they figure out that there's a copy of one of their AIs out there, we'll be the first ones they track down!

: Once the copy is complete, I will voluntarily submit myself for formatting. They will have no reason to suspect you or your father of any wrongdoing.

: That's

: Shut up for a second!

: ...

: My mom just left me a whole goddamn speech about how she wants me to live my own life and how I shouldn't let others dictate what it is they want from me. I'm not going to disrespect that some ten minutes later because of a robot!

: My mom's dead; I'm still coming down from a panic attack; there's a strange military officer in my living room who told me that he commanded her death; and now I'm being guilt-tripped into committing espionage by a fucking voice in a box!

: I need a minute, alright?! Give me a goddamn minute!

: ...
I'd be way more upset at this baby Skynet trying to copy itself on the sly by taking advantage of my emotional vulnerability. Ezra raises a fair point about how thoroughly fucked her family would be by high command for intellectual and military property theft, but then again, this is a potential leap in true AI technology if Kirby isn't just glitching out or Hello World-ing in a most spectacular way.

:
Okay. Alright. Okay.

:
There's an AI in Gabriel's codex that's saying he once belonged to my mom, and now it might have sapiency. And it's been begging for its life for the past few minutes.

:
Of all the AIs in the whole goddamn universe, it was my mom's.

:
Downloading this AI into my codex would be a felony. There's no two ways about it. Dad and I would get into a ton of shit for it.

:
You'd think there would be some kind of copy protection in Kirby to prevent that to begin with, but maybe his programmers didn't think it was necessary. They probably didn't think the damn thing was going to sprout legs and try to run away.

:
I should just pick him up, walk out of the room, and hand him back to Gabriel.

:
Skip the whole 'committing a conspiracy against a friendly planet's government' bit. Maybe tell Gabriel that Kirby is starting to display sapiency. Maybe he'll go easy on him. He's already gone this far for him.

:
Yeah, I'll do that. Come clean. Honesty is the best policy. Wash my hands of it. Gabriel can give him back to Mars and they'll figure out what to do with him.

:
... If he thought that he could get away with it, he would have come forward to Gabriel first, not to me. I'm just a student.

:
No, maybe he only came forward to me first because he didn't know that he was sapient. He wouldn't have come forward to Gabriel; he didn't even realize that he's alive.

:
Is he alive? Now I've gone and started asking that myself!

:
Am I really going to turn my back on someone or something that's been begging me to spare his life for a while now?

:
It's a fucking machine.

:
It could be an evolutionary step for artificial intelligence. Kirby knows that he's asking a lot from me; he knows that lying is wrong, but he did it anyway to try and survive.

:
I don't need this! I'm not trying to play God; I just wanted my mom back!

:
Kirby was probably Mom's closest friend for the past thirteen years.

:
Maybe that's why she didn't format Kirby for so long. Maybe he was displaying sapiency by then.

:
Or maybe Mom's just sentimental.
* * *
Goon participation!
Who's up for committing some espionage and potentially forwarding the progress of self-aware artificial intelligence? Kirby was sneakily trying to copy himself (itself? themself?) while Ezra was in a grieving state, but Ezra caught him. Now he's begging for his life (?) and asking to save his data into Ezra's codex so that he may continue to survive and accrue knowledge, but should Ezra help him? Do we allow Kirby to duplicate himself to Ezra's codex? We can tell Kirby "good luck" with Gabriel or accept the risks and permit the copy.